
HyNet: Learning Local Descriptor with Hybrid Similarity Measure and Triplet Loss

Neural Information Processing Systems

In this paper, we investigate how L2 normalisation affects the back-propagated descriptor gradients during training. Based on our observations, we propose HyNet, a new local descriptor that leads to state-of-the-art results in matching. HyNet introduces a hybrid similarity measure for the triplet margin loss, a regularisation term constraining the descriptor norm, and a new network architecture that performs L2 normalisation of all intermediate feature maps and the output descriptors. HyNet surpasses previous methods by a significant margin on standard benchmarks that include patch matching, verification, and retrieval, and it also outperforms full end-to-end methods on 3D reconstruction tasks.
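To make the abstract's ingredients concrete, here is a minimal, self-contained sketch of a triplet margin loss built on a hybrid similarity over L2-normalised descriptors. The specific linear blend of inner product and negative L2 distance, and the `alpha` weight, are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
import math


def l2_normalize(v):
    """Scale a vector to unit L2 norm."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]


def hybrid_similarity(a, b, alpha=0.5):
    """Blend inner-product similarity with negated L2 distance.

    NOTE: this particular blend and the value of alpha are assumptions
    made for illustration; the paper defines its own hybrid measure.
    """
    a, b = l2_normalize(a), l2_normalize(b)
    inner = sum(x * y for x, y in zip(a, b))
    dist = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return alpha * inner - (1.0 - alpha) * dist


def triplet_margin_loss(anchor, positive, negative, margin=1.0, alpha=0.5):
    """Standard triplet margin loss: push the positive similarity above
    the negative similarity by at least `margin`."""
    s_pos = hybrid_similarity(anchor, positive, alpha)
    s_neg = hybrid_similarity(anchor, negative, alpha)
    return max(0.0, margin - s_pos + s_neg)
```

With a matching positive and an orthogonal negative the loss is zero; swapping the roles of positive and negative yields a large positive loss, which is the signal that drives descriptor training.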


Review for NeurIPS paper: HyNet: Learning Local Descriptor with Hybrid Similarity Measure and Triplet Loss

Neural Information Processing Systems

Weaknesses: - The claimed contributions are somewhat weak. In fact, I think the main contribution is the gradient analysis (as I mentioned in Strengths), and all three claimed contributions can be bundled into one: "incremental improvements of previous works". Namely: C1: the L2 regularisation is interesting, but the ablation study shows it has the smallest effect on performance; C2: the hybrid similarity measure is a simple combination of two established similarity measures, and it additionally adds another hyper-parameter (alpha) that seems very sensitive to the setup (Fig. 5(a) shows that a lower alpha reduces performance by 1 mAP, and a higher one by 0.5 mAP); C3: the novel architecture is almost identical to that of [21, 22] with the addition of the FRN block from [36] (the ablation study shows this gives the largest performance increase), so it is more a practical combination of previous work than an actual contribution.


Review for NeurIPS paper: HyNet: Learning Local Descriptor with Hybrid Similarity Measure and Triplet Loss

Neural Information Processing Systems

I have read the reviews, the rebuttal, and much of the paper itself. The paper provides an accessible analysis of the role of L2 normalization on two common "similarity measures". The main issue I find with the analysis is that the claim in lines 89-91 is circular reasoning -- is it not the role of the analysis to show why L2 normalization is beneficial? The three architectural modifications suggested are reasonably tied to the analysis, and while none is groundbreaking by itself, the final method is powerful. Like some of the reviewers, I am concerned that the major contributor seems to be the FRN method.

